
    Not all errors are alike: modulation of error-related neural responses in musical joint action

    During joint action, the sense of agency enables interaction partners to implement corrective and adaptive behaviour in response to performance errors. When agency becomes ambiguous (e.g. when action similarity encourages perceptual self–other overlap), confusion as to who produced what may disrupt this process. The current experiment investigated how ambiguity of agency affects behavioural and neural responses to errors in a joint action domain where self–other overlap is common: musical duos. Pairs of pianists performed piano pieces in synchrony, playing either the same pitches (ambiguous agency) or different pitches (unambiguous agency) while electroencephalography (EEG) was recorded for each individual. Behavioural and event-related potential results showed no effects of the agency manipulation but revealed differences in how distinct error types are processed. Self-produced ‘wrong note’ errors (substitutions) were left uncorrected, showed post-error slowing and elicited an error-related negativity (ERN) peaking before erroneous keystrokes (pre-ERN). In contrast, self-produced ‘extra note’ errors (additions) exhibited pre-error slowing as well as error and post-error speeding, were rapidly corrected and elicited the ERN. Other-produced errors evoked a feedback-related negativity but no behavioural effects. Overall, these findings shed light on how the nervous system supports fluent interpersonal coordination in real-time joint action by employing distinct mechanisms to manage different types of errors.

    Electrophysiological activity from over the cerebellum and cerebrum during eye blink conditioning in human subjects

    We report the results of an experiment in which electrophysiological activity was recorded from the human cerebellum and cerebrum in a sample of 14 healthy subjects before, during and after a classical eye blink conditioning procedure with an auditory tone as the conditional stimulus and stimulation of the maxillary nerve as the unconditional stimulus. The primary aim was to show changes in the cerebellum and cerebrum correlated with behavioral ocular responses. Electrodes recorded EMG and EOG at peri-ocular sites, EEG from over the frontal eye-fields and the electrocerebellogram (ECeG) from over the posterior fossa. Of the 14 subjects, half conditioned strongly while the other half were resistant. We confirmed that conditionability was linked under our conditions to the personality dimension of extraversion-introversion. Inhibition of cerebellar activity was shown prior to the conditioned response, as predicted by Albus (1971). However, pausing in high-frequency ECeG and the appearance of a contingent negative variation (CNV) in both central leads occurred in all subjects. These findings led us to conclude that while conditioned cerebellar pausing may be necessary, it is not sufficient alone to produce overt behavioral conditioning, implying the existence of another central mechanism. The outcomes of this experiment indicate the potential value of noninvasive electrophysiology of the cerebellum.

    Neural tracking and integration of 'self' and 'other' in improvised interpersonal coordination

    Humans coordinate their movements with one another in a range of everyday activities and skill domains. Optimal joint performance requires the continuous anticipation of and adaptation to each other's movements, especially when actions are spontaneous rather than pre-planned. Here we employ dual-EEG and frequency-tagging techniques to investigate how the neural tracking of self- and other-generated movements supports interpersonal coordination during improvised motion. LEDs flickering at 5.7 and 7.7 Hz were attached to participants’ index fingers in 28 dyads as they produced novel patterns of synchronous horizontal forearm movements. EEG responses at these frequencies revealed enhanced neural tracking of self-generated movements when leading and of other-generated movements when following. A marker of self–other integration at 13.4 Hz (the intermodulation frequency of 5.7 and 7.7 Hz) peaked when no leader was designated and mutual adaptation and movement synchrony were maximal. Furthermore, the amplitude of EEG responses reflected differences in the capacity of dyads to synchronize their movements, offering a neurophysiologically grounded perspective for understanding the perceptual-motor mechanisms underlying joint action.
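
    For illustration only: the frequency-tagging logic amounts to reading amplitudes off the EEG spectrum at the two tagged frequencies and at their intermodulation frequency (5.7 + 7.7 = 13.4 Hz). The sketch below does this on simulated data; the sampling rate, signal construction and simple FFT pipeline are assumptions for the example, not the study's analysis code.

```python
import numpy as np

fs = 512.0                      # assumed EEG sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)    # 60 s of simulated data

f_self, f_other = 5.7, 7.7      # tagging frequencies from the study
f_im = f_self + f_other         # intermodulation frequency = 13.4 Hz

# Toy signal: components at both tagged frequencies, a weaker
# intermodulation component, and broadband noise.
rng = np.random.default_rng(0)
eeg = (1.0 * np.sin(2 * np.pi * f_self * t)
       + 0.8 * np.sin(2 * np.pi * f_other * t)
       + 0.3 * np.sin(2 * np.pi * f_im * t)
       + rng.standard_normal(t.size))

amps = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

def amp_at(f_target):
    """Amplitude at the spectral bin closest to f_target."""
    return amps[np.argmin(np.abs(freqs - f_target))]

print({f: round(amp_at(f), 3) for f in (f_self, f_other, f_im)})
```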

    Brain networks for temporal adaptation, anticipation, and sensory-motor integration in rhythmic human behavior

    Human interaction often requires the precise yet flexible interpersonal coordination of rhythmic behavior, as in group music making. The present fMRI study investigates the functional brain networks that may facilitate such behavior by enabling temporal adaptation (error correction), prediction, and the monitoring and integration of information about ‘self’ and the external environment. Participants were required to synchronize finger taps with computer-controlled auditory sequences that were presented either at a globally steady tempo with local adaptations to the participants' tap timing (Virtual Partner task) or with gradual tempo accelerations and decelerations but without adaptation (Tempo Change task). Connectome-based predictive modelling was used to examine patterns of brain functional connectivity related to individual differences in behavioral performance and parameter estimates from the adaptation and anticipation model (ADAM) of sensorimotor synchronization for these two tasks under conditions of varying cognitive load. Results revealed distinct but overlapping brain networks associated with ADAM-derived estimates of temporal adaptation, anticipation, and the integration of self-controlled and externally controlled processes across task conditions. The partial overlap between ADAM networks suggests common hub regions that modulate functional connectivity within and between the brain's resting-state networks and additional sensory-motor regions and subcortical structures in a manner reflecting coordination skill. Such network reconfiguration might facilitate sensorimotor synchronization by enabling shifts in focus on internal and external information, and, in social contexts requiring interpersonal coordination, variations in the degree of simultaneous integration and segregation of these information sources in internal models that support self, other, and joint action planning and prediction.
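
    As a toy illustration of the adaptation (error-correction) component that ADAM quantifies, the sketch below simulates a simple first-order linear phase-correction rule for tapping with a steady pacing sequence. The parameter values and the omission of ADAM's anticipation and motor-noise terms are simplifying assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

ioi = 0.5        # pacing inter-onset interval in seconds (assumed)
alpha = 0.6      # phase-correction gain (illustrative value)
sigma = 0.02     # timekeeper noise SD in seconds (assumed)
n_taps = 500

asyn = np.zeros(n_taps)          # tap-minus-pacer asynchronies
for n in range(n_taps - 1):
    # On each cycle a fraction alpha of the current asynchrony is
    # corrected; timekeeper noise perturbs the produced interval.
    produced = ioi - alpha * asyn[n] + rng.normal(0, sigma)
    asyn[n + 1] = asyn[n] + produced - ioi

print("asynchrony SD (ms):", round(1000 * asyn.std(), 1))
```

    In the study itself, parameters estimated from behaviour were related to patterns of functional connectivity; the code above is only meant to make the error-correction idea concrete.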

    Neural tracking of the musical beat is enhanced by low-frequency sounds

    Music makes us move, and using bass instruments to build the rhythmic foundations of music is especially effective at inducing people to dance to periodic pulse-like beats. Here, we show that this culturally widespread practice may exploit a neurophysiological mechanism whereby low-frequency sounds shape the neural representations of rhythmic input by boosting selective locking to the beat. Cortical activity was captured using electroencephalography (EEG) while participants listened to a regular rhythm or to a relatively complex syncopated rhythm conveyed either by low tones (130 Hz) or high tones (1236.8 Hz). We found that cortical activity at the frequency of the perceived beat is selectively enhanced compared with other frequencies in the EEG spectrum when rhythms are conveyed by bass sounds. This effect is unlikely to arise from early cochlear processes, as revealed by auditory physiological modeling, and was particularly pronounced for the complex rhythm requiring endogenous generation of the beat. The effect is likewise not attributable to differences in perceived loudness between low and high tones, as a control experiment manipulating sound intensity alone did not yield similar results. Finally, the privileged role of bass sounds is contingent on allocation of attentional resources to the temporal properties of the stimulus, as revealed by a further control experiment examining the role of a behavioral task. Together, our results provide a neurobiological basis for the convention of using bass instruments to carry the rhythmic foundations of music and to drive people to move to the beat.
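
    A minimal sketch of the kind of spectral measure involved, on simulated data: the EEG amplitude at the beat frequency is compared against the local noise floor estimated from neighbouring frequency bins. The beat frequency, epoch length and noise-correction window below are assumptions for the example, not the study's parameters.

```python
import numpy as np

fs, dur = 512.0, 48.0            # assumed sampling rate (Hz) and epoch (s)
t = np.arange(0, dur, 1 / fs)
beat_f = 1.25                    # assumed perceived-beat frequency (Hz)

# Simulated EEG: a small beat-rate component embedded in noise.
rng = np.random.default_rng(2)
eeg = 0.4 * np.cos(2 * np.pi * beat_f * t) + rng.standard_normal(t.size)

amps = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

k = int(np.argmin(np.abs(freqs - beat_f)))
# Noise correction: subtract the mean amplitude of surrounding bins,
# skipping the bins immediately adjacent to the target bin.
noise = np.r_[amps[k - 12:k - 2], amps[k + 3:k + 13]].mean()
print("noise-corrected beat amplitude:", round(amps[k] - noise, 3))
```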

    Dynamic modulation of beta band cortico-muscular coupling induced by audio-visual rhythms

    Human movements often spontaneously fall into synchrony with auditory and visual environmental rhythms. Related behavioral studies have shown that motor responses are automatically and unintentionally coupled with external rhythmic stimuli. However, the neurophysiological processes underlying such motor entrainment remain largely unknown. Here we investigated with electroencephalography (EEG) and electromyography (EMG) the modulation of neural and muscular activity induced by periodic audio and/or visual sequences. The sequences were presented at either 1 Hz or 2 Hz while participants maintained constant finger pressure on a force sensor. The results revealed that although there was no change of amplitude in participants’ EMG in response to the sequences, the synchronization between EMG and EEG recorded over motor areas in the beta (12–40 Hz) frequency band was dynamically modulated, with maximal coherence occurring about 100 ms before each stimulus. These modulations in beta EEG–EMG motor coherence were found for the 2 Hz audio-visual sequences, confirming at a neurophysiological level the enhancement of motor entrainment with multimodal rhythms that fall within preferred perceptual and movement frequency ranges. Our findings identify beta band cortico-muscular coupling as a potential underlying mechanism of motor entrainment, further elucidating the nature of the link between sensory and motor systems in humans.
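
    The cortico-muscular coupling measure can be illustrated with magnitude-squared coherence between simulated EEG and EMG signals that share a beta-band component. The sampling rate, signal construction and window length are assumptions; the study's time-resolved analysis around stimulus onsets is not reproduced here.

```python
import numpy as np
from scipy.signal import coherence

fs = 1000.0                       # assumed sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(3)

# Toy signals: a shared 20 Hz (beta) component stands in for
# cortico-muscular coupling between motor-cortex EEG and finger EMG.
shared = np.sin(2 * np.pi * 20 * t)
eeg = 0.5 * shared + rng.standard_normal(t.size)
emg = 0.5 * shared + rng.standard_normal(t.size)

f, coh = coherence(eeg, emg, fs=fs, nperseg=1024)
beta = (f >= 12) & (f <= 40)      # beta band as defined in the abstract
print("mean beta-band EEG-EMG coherence:", round(float(coh[beta].mean()), 3))
```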

    Does movement amplitude of a co-performer affect individual performance in musical synchronization?

    Interpersonal coordination in musical ensembles often involves multisensory cues, with visual information about body movements supplementing co-performers’ sounds. Previous research on the influence of movement amplitude of a visual stimulus on basic sensorimotor synchronization has shown mixed results. Uninstructed visuomotor synchronization seems to be influenced by the amplitude of a visual stimulus, but instructed visuomotor synchronization is not. While music performance presents a special case of visually mediated coordination, involving both uninstructed (spontaneously coordinating ancillary body movements with co-performers) and instructed (producing sound on a beat) forms of synchronization, the underlying mechanisms might also support rhythmic interpersonal coordination in the general population. We asked whether visual cue amplitude would affect nonmusicians’ synchronization of sound and head movements in a musical drumming task designed to be accessible regardless of musical experience. Given the mixed prior results, we considered two competing hypotheses. H1: higher amplitude visual cues will improve synchronization. H2: different amplitude visual cues will have no effect on synchronization. Participants observed a human-derived motion capture avatar with three levels of movement amplitude, or a still image of the avatar, while drumming along to the beat of tempo-changing music. The moving avatars were always timed to match the music. We measured temporal asynchrony (drumming relative to the music), predictive timing, ancillary movement fluctuation, and cross-spectral coherence of ancillary movements between the participant and avatar. The competing hypotheses were tested using conditional equivalence testing. This method involves using a statistical equivalence test in the event that standard hypothesis tests show no differences. Our results showed no statistical differences across visual cue types. Therefore, we conclude that there is not a strong effect of visual stimulus amplitude on instructed synchronization.
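
    The conditional equivalence-testing logic can be sketched as follows: if a standard test finds no difference between cue types, a two one-sided tests (TOST) procedure checks whether the difference is statistically equivalent to zero within pre-set bounds. The function below is a generic TOST for independent samples applied to made-up data and bounds, not the authors' analysis script.

```python
import numpy as np
from scipy import stats

def tost_ind(x, y, low, high):
    """Two one-sided tests (TOST) for equivalence of independent means.

    Equivalence within [low, high] is supported when both one-sided
    p values are small; the TOST p value is their maximum.
    """
    _, p_lower = stats.ttest_ind(x - low, y, alternative="greater")
    _, p_upper = stats.ttest_ind(x - high, y, alternative="less")
    return max(p_lower, p_upper)

rng = np.random.default_rng(4)
asyn_a = rng.normal(20, 10, 30)   # hypothetical asynchronies (ms), cue type A
asyn_b = rng.normal(21, 10, 30)   # hypothetical asynchronies (ms), cue type B

# Step 1: standard test for a difference; step 2 (if null): equivalence test.
_, p_diff = stats.ttest_ind(asyn_a, asyn_b)
if p_diff > 0.05:
    print("TOST p:", round(tost_ind(asyn_a, asyn_b, -10, 10), 3))
```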

    Neural tracking of visual periodic motion

    Periodicity is a fundamental property of biological systems, including human movement systems. Periodic movements support displacements of the body in the environment as well as interactions and communication between individuals. Here, we use electroencephalography (EEG) to investigate the neural tracking of visual periodic motion, and more specifically, the relevance of spatiotemporal information contained at and between its turning points. We compared EEG responses to visual sinusoidal oscillations versus nonlinear Rayleigh oscillations, which are both typical of human movements. These oscillations contain the same spatiotemporal information at their turning points but differ between turning points, with Rayleigh oscillations having an earlier peak velocity, which has been shown to increase an individual's capacity to produce accurately synchronized movements. EEG analyses highlighted the relevance of spatiotemporal information between the turning points by showing that the brain precisely tracks subtle differences in velocity profiles, as indicated by earlier EEG responses for Rayleigh oscillations. The results suggest that the brain is particularly responsive to velocity peaks in visual periodic motion, supporting their role in conveying behaviorally relevant timing information at a neurophysiological level. The results also suggest key functions of neural oscillations in the alpha and beta frequency bands, particularly in the right hemisphere. Together, these findings provide insights into the neural mechanisms underpinning the processing of visual periodic motion and the critical role of velocity peaks in enabling proficient visuomotor synchronization.

    huSync: a model and system for the measure of synchronization in small groups: a case study on musical joint action

    Human communication entails subtle non-verbal modes of expression, which can be analyzed quantitatively using computational approaches and thus support the human sciences. In this paper we present huSync, a computational framework and system that utilizes trajectory information extracted from video sequences using pose estimation algorithms to quantify synchronization between individuals in small groups. The system is exploited to study interpersonal coordination in musical ensembles. Musicians communicate with each other through sounds and gestures, providing non-verbal cues that regulate interpersonal coordination. huSync was applied to recordings of concert performances by a professional instrumental ensemble playing two musical pieces. We examined effects of different aspects of musical structure (texture and phrase position) on interpersonal synchronization, which was quantified by computing phase locking values of head motion for all possible within-group pairs. Results indicate that interpersonal coupling was stronger for polyphonic textures (ambiguous leadership) than for homophonic textures (clear melodic leader), and this difference was greater in early portions of phrases than at endings (where coordination demands are highest). Results were cross-validated against an analysis of audio features, showing links between phase locking values and event density. This research produced a system, huSync, that can quantify synchronization in small groups and is sensitive to dynamic modulations of interpersonal coupling related to ambiguity in leadership and coordination demands in standard video recordings of naturalistic human group interaction. huSync enabled a better understanding of the relationship between interpersonal coupling and musical structure, thus enhancing collaborations between human scientists and computer scientists.
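
    The core quantity, the phase locking value (PLV), is the magnitude of the average unit phase-difference vector between two movement signals. The sketch below computes a pairwise PLV on simulated head-motion traces; the band-pass range, sampling rate and filter settings are assumptions for the example rather than huSync's actual configuration.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def plv(x, y, fs, band=(0.5, 4.0)):
    """Phase locking value between two movement time series.

    Band-pass both signals, take instantaneous phases via the Hilbert
    transform, and average the unit vectors of the phase differences.
    """
    b, a = butter(2, band, btype="bandpass", fs=fs)
    phx = np.angle(hilbert(filtfilt(b, a, x)))
    phy = np.angle(hilbert(filtfilt(b, a, y)))
    return float(np.abs(np.mean(np.exp(1j * (phx - phy)))))

# Toy head-motion traces for one within-group pair, sampled at 25 fps.
fs = 25.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(5)
head_a = np.sin(2 * np.pi * 1.0 * t) + 0.3 * rng.standard_normal(t.size)
head_b = np.sin(2 * np.pi * 1.0 * t + 0.4) + 0.3 * rng.standard_normal(t.size)
print("pairwise PLV:", round(plv(head_a, head_b, fs), 3))
```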

    What movement force reveals about cognitive processes in music performance

    Expressive music performance requires exquisite movement timing and force control. Although measures of timing and force are both potentially informative about the cognitive processes underlying such performance, force control has received relatively little attention in empirical studies of music. This chapter begins with a historical review of research on timing and force in music performance. The claim that force deserves greater attention is justified by appealing to musicians' intuitions and research in kinesiology and sports science. Two studies that highlight the benefits of examining force control are discussed. The results of the first study suggest that forces applied during movement can be used to gauge the relative roles of auditory and motor imagery during musical action planning. The second study suggests that irregularities in force control are associated with fluctuations in certainty about upcoming actions. It is concluded that measures of force control go some way towards revealing the contents of a performer's action plans and the degree of (un)certainty in their musical intentions.